Source coding with escort distributions and Renyi entropy bounds

Author

  • Jean-François Bercher
Abstract

Article history: Received 20 May 2009; received in revised form 2 July 2009; accepted 7 July 2009; available online 15 July 2009. Communicated by A.R. Bishop. PACS: 02.50.-r; 05.90.+m; 89.70.+c


Similar articles

On escort distributions, q-gaussians and Fisher information

Escort distributions are a simple one-parameter deformation of an original distribution p. In Tsallis' extended thermostatistics, escort averages, defined with respect to an escort distribution, have proved useful for obtaining analytical results and variational equations, with in particular the equilibrium distributions obtained as maxima of Rényi–Tsallis entropy subject to constrain...
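As a quick, illustrative sketch (not code from any of the papers listed here; the function names are my own), the escort deformation P_i ∝ p_i^q and the Rényi entropy of order α can each be computed in a few lines:

```python
import numpy as np

def escort(p, q):
    """Escort distribution of order q: P_i = p_i**q / sum_j p_j**q."""
    pq = np.asarray(p, dtype=float) ** q
    return pq / pq.sum()

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha != 1), in nats:
    H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.3, 0.2])
print(escort(p, 2.0))          # escort distribution of order 2
print(renyi_entropy(p, 2.0))   # collision (order-2) Rényi entropy
```

For q = 1 the escort reduces to p itself, and as α → 1 the Rényi entropy tends to the Shannon entropy.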


Renyi Entropy Estimation Revisited

We revisit the problem of estimating the entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh and Tyagi (SODA 2015), improving their upper and lower bounds on the necessary sample size n. For estimating the Rényi entropy of order α, up to constant accuracy and error probability, we show the following upper bound: n = O(1) · 2^((1 − 1/α) H_α) for integer α...
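For concreteness, a plug-in estimate of the order-2 Rényi (collision) entropy from samples can be sketched as follows; this is only an illustration of the estimation problem, not the estimator from the paper:

```python
import collections
import math
import random

def renyi2_plugin(samples):
    """Plug-in estimate of the order-2 Rényi (collision) entropy:
    H_2 = -log2(sum_x p_x**2), with p_x replaced by empirical frequencies."""
    n = len(samples)
    counts = collections.Counter(samples)
    power_sum = sum((c / n) ** 2 for c in counts.values())
    return -math.log2(power_sum)

# Draw i.i.d. samples from a small known distribution and compare the
# estimate with the true value H_2 = -log2(0.16 + 0.09 + 0.04 + 0.01).
random.seed(0)
samples = random.choices("abcd", weights=[4, 3, 2, 1], k=20000)
print(renyi2_plugin(samples))
```

With 20,000 samples the estimate lands close to the true value of about 1.74 bits; the sample-size bounds above quantify how this accuracy scales with the entropy itself.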


The Rényi redundancy of generalized Huffman codes

If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...
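The exponentially weighted average referred to above is Campbell's length L(t) = (1/t) · log₂ Σ_i p_i 2^(t·l_i), which for any code satisfying the Kraft inequality is bounded below by the Rényi entropy of order α = 1/(1 + t). A minimal numerical sketch (my own notation, not code from the paper):

```python
import math

def exponential_average_length(p, lengths, t):
    """Campbell's exponentially weighted average codeword length,
    L(t) = (1/t) * log2(sum_i p_i * 2**(t * l_i)), for t > 0.
    As t -> 0 it recovers the ordinary average length sum_i p_i * l_i."""
    return math.log2(sum(pi * 2 ** (t * li) for pi, li in zip(p, lengths))) / t

def renyi_entropy_bits(p, alpha):
    """Rényi entropy of order alpha (alpha != 1), in bits."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

# A prefix code with lengths (1, 2, 2) for probabilities (0.5, 0.3, 0.2);
# the lengths satisfy the Kraft inequality: 2**-1 + 2**-2 + 2**-2 = 1.
p, lengths, t = [0.5, 0.3, 0.2], [1, 2, 2], 0.5
alpha = 1.0 / (1.0 + t)
# Campbell's bound: L(t) >= H_alpha(p)
print(exponential_average_length(p, lengths, t), renyi_entropy_bits(p, alpha))
```

The redundancy in this setting is the gap L(t) − H_α, which the modified Huffman algorithm minimizes.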


Entropy of Independent Experiments, Revisited

The weak law of large numbers implies that, under mild assumptions on the source, the Renyi entropy per produced symbol converges (in probability) towards the Shannon entropy rate. This paper quantifies the speed of this convergence for sources with independent (but not iid) outputs, generalizing and improving the result of Holenstein and Renner (IEEE Trans. Inform. Theory, 2011). (a) we charac...


Source Coding with Rényi's Entropy

A new measure, the average codeword length of a given order, is defined and its relationship with Rényi's entropy of that order is discussed. Some coding theorems are proved under this condition.



Journal:
  • CoRR

Volume abs/1109.3385  Issue

Pages  -

Publication date 2009